An Efficient Gaussian Mixture Reduction to Two Components

Authors

  • Naoya Yokoyama
  • Daiki Azuma
  • Shuji Tsukiyama
Abstract

In statistical methods such as statistical static timing analysis, a Gaussian mixture model (GMM) is a useful tool for representing a non-Gaussian distribution and handling correlation easily. In order to repeat various statistical operations such as summation and maximum on GMMs efficiently, the number of components should be restricted to around two. In this paper, we propose a method for reducing the number of components of a given GMM to two (2-GMM) such that the mean and the variance of the 2-GMM are equal to those of the original GMM and the normalized integral square error of the 2-GMM PDF is minimized. In order to demonstrate the performance of the proposed method, we show some experimental results.
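As a rough illustration of the ingredients involved (this is not the authors' algorithm), the sketch below reduces a one-dimensional GMM to two components by enumerating two-way splits of the components, merging each group moment-preservingly (which keeps the overall mean and variance of the mixture intact), and keeping the split with the smallest normalized integral squared error. The NISE normalization ISE / (∫f² + ∫g²), the brute-force split enumeration, and all function names are assumptions made for this example.

```python
import numpy as np
from itertools import combinations

def moment_merge(w, mu, var):
    """Moment-preserving merge of a group of weighted 1-D Gaussian components."""
    W = w.sum()
    m = np.dot(w, mu) / W
    v = np.dot(w, var + (mu - m) ** 2) / W
    return W, m, v

def mix_inner(w1, m1, v1, w2, m2, v2):
    """Closed-form integral of the product of two 1-D Gaussian mixture PDFs."""
    total = 0.0
    for a, ma, va in zip(w1, m1, v1):
        for b, mb, vb in zip(w2, m2, v2):
            s = va + vb
            total += a * b * np.exp(-0.5 * (ma - mb) ** 2 / s) / np.sqrt(2.0 * np.pi * s)
    return total

def nise(w1, m1, v1, w2, m2, v2):
    """Normalized ISE between two mixtures (normalization assumed: ISE / (int f^2 + int g^2))."""
    ff = mix_inner(w1, m1, v1, w1, m1, v1)
    gg = mix_inner(w2, m2, v2, w2, m2, v2)
    fg = mix_inner(w1, m1, v1, w2, m2, v2)
    return (ff - 2.0 * fg + gg) / (ff + gg)

def reduce_to_two(w, mu, var):
    """Brute force: try every two-way split, merge each group moment-preservingly,
    keep the split with minimum NISE.  Only sensible for the small mixtures that
    arise when statistical operations are repeated on 2-GMMs."""
    n = len(w)
    idx = np.arange(n)
    best = None
    for k in range(1, n // 2 + 1):
        for group in combinations(range(n), k):
            a = np.array(group)
            b = np.setdiff1d(idx, a)
            wa, ma, va = moment_merge(w[a], mu[a], var[a])
            wb, mb, vb = moment_merge(w[b], mu[b], var[b])
            cand = (np.array([wa, wb]), np.array([ma, mb]), np.array([va, vb]))
            err = nise(w, mu, var, *cand)
            if best is None or err < best[0]:
                best = (err, cand)
    return best

# Toy usage: collapse a 4-component mixture to two components.
w = np.array([0.3, 0.2, 0.3, 0.2])
mu = np.array([0.0, 0.5, 3.0, 3.5])
var = np.array([1.0, 1.2, 0.8, 1.0])
err, (w2, mu2, var2) = reduce_to_two(w, mu, var)
print("NISE:", err)
print("mean preserved:", np.isclose(np.dot(w, mu), np.dot(w2, mu2)))
```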

Similar articles

Superficial Gaussian Mixture Reduction

Many information fusion tasks involve the processing of Gaussian mixtures with a simple underlying shape but many components. This paper addresses the problem of reducing the number of components, allowing for faster density processing. The proposed approach is based on identifying components that are irrelevant to the overall density’s shape by means of the curvature of the density’s surface. The key i...
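As a loose illustration of the curvature idea only (the snippet above is truncated, so this is not the paper's actual criterion), the sketch below scores each component of a one-dimensional mixture by how much dropping it, after renormalizing the remaining weights, perturbs the analytic second derivative of the density. The L2 score and all names are assumptions.

```python
import numpy as np

def gauss_pdf(x, m, v):
    """1-D Gaussian density with mean m and variance v."""
    return np.exp(-0.5 * (x - m) ** 2 / v) / np.sqrt(2.0 * np.pi * v)

def gauss_pdf_dd(x, m, v):
    """Analytic second derivative of the Gaussian density with respect to x."""
    return gauss_pdf(x, m, v) * ((x - m) ** 2 - v) / (v * v)

def curvature(x, w, m, v):
    """Second derivative of the mixture density, evaluated on the grid x."""
    return sum(wi * gauss_pdf_dd(x, mi, vi) for wi, mi, vi in zip(w, m, v))

def relevance_scores(x, w, m, v):
    """L2 change in curvature caused by dropping each component in turn."""
    full = curvature(x, w, m, v)
    dx = x[1] - x[0]
    scores = []
    for i in range(len(w)):
        keep = [j for j in range(len(w)) if j != i]
        wk = w[keep] / w[keep].sum()              # renormalize remaining weights
        reduced = curvature(x, wk, m[keep], v[keep])
        scores.append(np.sum((full - reduced) ** 2) * dx)
    return np.array(scores)

# Toy usage: one relevance score per component (low score = shape barely changes).
x = np.linspace(-10.0, 10.0, 2001)
w = np.array([0.45, 0.10, 0.45])
m = np.array([-3.0, 0.0, 3.0])
v = np.array([1.0, 9.0, 1.0])
print(relevance_scores(x, w, m, v))
```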

Negative Selection Based Data Classification with Flexible Boundaries

One of the most important artificial immune algorithms is the negative selection algorithm, an anomaly detection and pattern recognition technique; however, recent research has also shown its successful application to data classification. Most negative selection methods use deterministic boundaries to distinguish between self and non-self spaces. In this paper, two...
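For context, the sketch below shows a classical real-valued negative-selection setup with fixed detector radii, i.e. the deterministic boundaries the snippet refers to; the paper's flexible-boundary variants are not reproduced, and all parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_detectors(self_samples, n_detectors=200, self_radius=0.1):
    """Keep random candidate detectors that lie far enough from every self sample."""
    detectors = []
    while len(detectors) < n_detectors:
        cand = rng.random(self_samples.shape[1])
        if np.min(np.linalg.norm(self_samples - cand, axis=1)) > self_radius:
            detectors.append(cand)
    return np.array(detectors)

def is_nonself(x, detectors, detector_radius=0.1):
    """A point covered by any detector is flagged as non-self (anomalous)."""
    return bool(np.any(np.linalg.norm(detectors - x, axis=1) <= detector_radius))

# Toy usage: self data clustered near the origin of the unit square.
self_samples = rng.random((100, 2)) * 0.3
detectors = train_detectors(self_samples)
print(is_nonself(np.array([0.9, 0.9]), detectors))   # likely True (far from self)
print(is_nonself(np.array([0.1, 0.1]), detectors))   # likely False (inside self region)
```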

Gaussian Mixture Reduction Using Reverse Kullback-Leibler Divergence

We propose a greedy mixture reduction algorithm which is capable of pruning mixture components as well as merging them based on the Kullback-Leibler divergence (KLD). The algorithm is distinct from the well-known Runnalls’ KLD-based method since it is not restricted to merging operations. The capability of pruning (in addition to merging) gives the algorithm the ability to preserve the peaks ...
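The merge-only baseline mentioned here, Runnalls' method, can be sketched for one-dimensional components as follows; the paper's reverse-KLD pruning rule is not reproduced, and the example values are illustrative.

```python
import numpy as np

def merge_pair(w1, m1, v1, w2, m2, v2):
    """Moment-preserving merge of two weighted 1-D Gaussian components."""
    w = w1 + w2
    m = (w1 * m1 + w2 * m2) / w
    v = (w1 * (v1 + (m1 - m) ** 2) + w2 * (v2 + (m2 - m) ** 2)) / w
    return w, m, v

def runnalls_cost(w1, m1, v1, w2, m2, v2):
    """Runnalls' upper bound on the KLD increase caused by merging two components."""
    _, _, v = merge_pair(w1, m1, v1, w2, m2, v2)
    return 0.5 * ((w1 + w2) * np.log(v) - w1 * np.log(v1) - w2 * np.log(v2))

def reduce_mixture(w, m, v, target):
    """Greedily merge the cheapest pair until only `target` components remain."""
    w, m, v = list(w), list(m), list(v)
    while len(w) > target:
        pairs = [(runnalls_cost(w[i], m[i], v[i], w[j], m[j], v[j]), i, j)
                 for i in range(len(w)) for j in range(i + 1, len(w))]
        _, i, j = min(pairs)
        w[i], m[i], v[i] = merge_pair(w[i], m[i], v[i], w[j], m[j], v[j])
        del w[j], m[j], v[j]
    return np.array(w), np.array(m), np.array(v)

# Toy usage: reduce a 4-component mixture to 2 components.
print(reduce_mixture([0.3, 0.3, 0.2, 0.2], [0.0, 0.2, 3.0, 3.3], [1.0, 1.2, 0.8, 1.1], 2))
```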

On the number of Gaussian components in a mixture: an application to speaker verification tasks

Despite all advances in the speaker recognition domain, Gaussian Mixture Models (GMMs) remain the state-of-the-art modeling technique in speaker recognition systems. The key idea is to approximate the probability density function of the feature vectors associated with a speaker by a weighted sum of Gaussian densities. Although the extremely efficient Expectation-Maximization (EM) algorithm c...
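As a minimal illustration of the modeling idea (not the paper's experiments), the sketch below fits one GMM per speaker with EM via scikit-learn and scores a test segment by its average log-likelihood under each model; the synthetic features and the component count are placeholders for real MFCC frames and a tuned model order.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Synthetic "feature vectors" standing in for two speakers' training frames.
speaker_a = rng.normal(loc=0.0, scale=1.0, size=(500, 12))
speaker_b = rng.normal(loc=1.5, scale=1.2, size=(500, 12))

# The number of Gaussian components per model is the quantity the paper studies.
gmm_a = GaussianMixture(n_components=8, covariance_type="diag", random_state=0).fit(speaker_a)
gmm_b = GaussianMixture(n_components=8, covariance_type="diag", random_state=0).fit(speaker_b)

# Score a test segment drawn from speaker A under both models.
test = rng.normal(loc=0.0, scale=1.0, size=(200, 12))
print("avg log-likelihood under A:", gmm_a.score(test))
print("avg log-likelihood under B:", gmm_b.score(test))   # expected to be lower
```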

An Efficient Method for Model Reduction in Diffuse Optical Tomography

We present an efficient method for the reduction of model equations in the linearized diffuse optical tomography (DOT) problem. We first implement the maximum a posteriori (MAP) estimator and Tikhonov regularization, which are based on applying preconditioners to linear perturbation equations. For model reduction, the preconditioner is split into two parts: the principal components are consid...
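For orientation only, the sketch below shows a plain Tikhonov-regularized solution of a generic linearized inverse problem; the paper's MAP formulation, preconditioner splitting, and principal-component truncation are not reproduced, and the toy Jacobian is a stand-in for the linearized DOT forward model.

```python
import numpy as np

def tikhonov_solve(J, y, lam=1e-2):
    """Solve the regularized normal equations (J^T J + lam*I) x = J^T y."""
    n = J.shape[1]
    return np.linalg.solve(J.T @ J + lam * np.eye(n), J.T @ y)

# Toy usage: recover a single perturbation from noisy linearized measurements.
rng = np.random.default_rng(0)
J = rng.normal(size=(120, 80))                 # stand-in for the linearized forward model
x_true = np.zeros(80)
x_true[10] = 1.0                               # one perturbed unknown
y = J @ x_true + 0.01 * rng.normal(size=120)
x_hat = tikhonov_solve(J, y)
print(x_hat[10])                               # close to 1 in this well-posed toy case
```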

Journal title:

Volume   Issue

Pages  -

Publication date: 2016